WHAT WE KNOW SO FAR ABOUT APPLE'S BATTLE WITH THE FBI
Source: ITnews.com
This is a complicated case that could stretch on for months, but we'll keep this updated as more news breaks.
At this writing, Apple’s battle with the FBI over how much it can and should help in the investigation of the San Bernardino shootings is less than a week old. But already it’s explosive, to say the least. The government has accused Apple of being more concerned with marketing than the fight against terrorism, and Apple has drawn a line in the sand, saying that complying with the FBI’s request “would undermine the very freedoms and liberty our government is meant to protect.”
This fight isn’t going to be over anytime soon, so we’ll keep this FAQ updated as events unfold. If you have more questions—or want to respectfully debate the implications this case will have on privacy and security—please chime away in the comments and we’ll do our best to make everything about this confusing case as clear as possible.
Where does it stand right now?
The United States District Court for the Central District of California issued an order on February 16, giving Apple five business days to respond. Apple posted an open letter to customers on its website explaining its side of the case, prompting government attorneys to file a motion on February 19 disagreeing with Apple’s view of the situation, and asking the court to force Apple to comply.
A hearing is scheduled to take place in Riverside, CA, on March 22. Until then, the lawyers will file more motions, while the two sides also take their case to the court of public opinion. On Sunday, February 21, FBI Director James Comey posted at Lawfare that we should "take a deep breath and stop saying the world is ending." Apple updated its open letter on Monday, February 22, to add its own FAQ on privacy and security, and Tim Cook sent a memo to employees calling on the FBI to drop its request. So far, public opinion is not on Apple’s side, but this is only the beginning.
The Basics
So the FBI has an iPhone 5c that belonged to the San Bernardino shooter, and they think it has evidence inside?
The iPhone 5c in question was used by San Bernardino shooter Syed Rizwan Farook, but it was his work phone, so it technically belonged to his employer, the San Bernardino County Department of Public Health. Farook also had a personal phone as well as a personal computer, but he physically destroyed both before the December 2 shooting. Farook was killed in a firefight with police.
In the course of its investigation, the FBI wants to examine the iPhone 5c for evidence. The DOJ’s court filing from Friday, February 19 reads:
The government has reason to believe that Farook used that iPhone to communicate with some of the very people whom he and [his also-deceased wife Tashfeen] Malik murdered. The phone may contain critical communications and data prior to and around the time of the shooting that, thus far: (1) has not been accessed; (2) may reside solely on the phone; and (3) cannot be accessed by any other means known to either the government or Apple.
But if it was his employer’s phone, can’t they access its data, or at least consent to the search?
The San Bernardino County Department of Public Health did consent to the search, but the iPhone is locked with a passcode (reportedly a 4-digit PIN, not something more complex), and apparently the county didn’t use good mobile device management practices, because it doesn’t know that passcode and couldn’t access anything on the phone without it. From the same February 19 court filing:
The FBI obtained a warrant to search the iPhone, and the owner of the iPhone, Farook’s employer, also gave the FBI its consent to the search. Because the iPhone was locked, the government subsequently sought Apple’s help in its efforts to execute the lawfully issued search warrant. Apple refused.
Why is Apple refusing to unlock the phone?
That isn’t what Apple was asked to do; Apple actually has no way of unlocking a locked iPhone. Apple does have a way to extract data from a device running iOS 7 or earlier without having to unlock the phone, and it has done this before for law enforcement with a proper court order (another government filing estimates at least 70 times).
But starting with iOS 8, the data on an iPhone is encrypted by default as soon as you enable the passcode feature. Since Farook’s iPhone 5c is running iOS 9, the only way to access the encrypted data it holds is to unlock the phone with the passcode.
Since the owner of the phone (Farook’s employer) doesn’t know the passcode, and Apple doesn’t know the passcode, and Farook is dead, the FBI is stuck trying to crack the passcode through brute force.
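To see why everyone is stuck, it helps to sketch how the encryption works. The Python below is purely illustrative (the KDF, the iteration count, and the UID value are stand-ins, not Apple’s actual implementation): iOS entangles the passcode with a secret key fused into the phone’s hardware, so the decryption key can only ever be derived on the device itself, by someone who knows the passcode.

```python
# Illustrative sketch only: the real derivation runs inside the device's
# hardware and uses a per-device UID key that never leaves the chip.
# PBKDF2 and the constants below are stand-ins, not Apple's actual KDF.
import hashlib

DEVICE_UID_KEY = bytes.fromhex("00" * 32)  # hypothetical hardware-fused secret

def derive_data_protection_key(passcode: str) -> bytes:
    # Mix the passcode with the device-unique key. Without both inputs,
    # and without the physical device, the key is unrecoverable.
    return hashlib.pbkdf2_hmac(
        "sha256",
        passcode.encode(),
        DEVICE_UID_KEY,   # the salt stands in for the UID entanglement
        100_000,          # iterations tuned so each guess is deliberately slow
    )

# Apple never holds the passcode or the UID-derived key, so it cannot
# recompute this key off-device; guessing passcodes on the phone itself
# is the only way in.
key = derive_data_protection_key("1234")
```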
What does the FBI want Apple to do to help brute-force the passcode?
The best defense iOS has against a brute-force attack is the Erase Data feature, which wipes all the data on the iPhone after 10 failed passcode attempts. Farook’s iPhone has a 4-digit PIN, which has only 10,000 possible combinations, so it shouldn’t take long to crack, but it would certainly take more than 10 tries.
So the FBI’s request, and the court’s February 16 order, is for Apple to create a sideloadable SIF (software image file) of iOS that can run entirely in the iPhone’s RAM without touching any other data on the device. The FBI wants Apple to sign that software so the iPhone, and only this iPhone, will run it. Once installed, the software would disable the Erase Data setting.
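The “only this iPhone” part is feasible because Apple’s boot chain verifies signatures that are personalized to a device’s unique chip ID, its ECID. The sketch below is a simplification built on that assumption; the HMAC construction and the key are stand-ins for Apple’s real signing scheme.

```python
# Conceptual sketch of device-personalized code signing. "ECID" is real
# Apple terminology; everything else here is a simplified stand-in.
import hashlib
import hmac

APPLE_SIGNING_KEY = b"hypothetical-signing-secret"  # stand-in for Apple's key

def sign_for_device(image: bytes, ecid: int) -> bytes:
    # Bind the signature to one device's unique chip ID, so the boot
    # chain on any other iPhone would refuse to run the same image.
    payload = image + ecid.to_bytes(8, "big")
    return hmac.new(APPLE_SIGNING_KEY, payload, hashlib.sha256).digest()

# The signed image would only validate on the phone whose ECID was baked in.
ticket = sign_for_device(b"<custom iOS build>", ecid=0x1234567890AB)
```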
The FBI also wants to try passcodes as quickly as possible, so it wants Apple to disable the delay between passcode attempts, plus allow passcodes to be entered electronically by a computer, either through the iPhone’s Lightning port or wirelessly, a capability that has never existed in a publicly shipping version of iOS. That’s a big deal: as Matthew Panzarino points out at TechCrunch, it asks Apple to introduce a new weakness into iOS.
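Some back-of-the-envelope arithmetic shows why those protections matter. The sketch below assumes the roughly-80-milliseconds-per-guess figure cited in coverage of the case (the key derivation itself is deliberately slow, and that cost can’t be removed); with the artificial delays and the wipe gone, a 4-digit PIN falls in minutes.

```python
# Back-of-the-envelope estimate, assuming the ~80 ms-per-guess figure
# cited in press coverage; the stock-iOS delay schedule is simplified.
PIN_SPACE = 10 ** 4        # 10,000 possible 4-digit passcodes
GUESS_COST_S = 0.08        # ~80 ms per attempt, enforced by key derivation

# With the software protections the FBI wants removed:
worst_case_min = PIN_SPACE * GUESS_COST_S / 60
print(f"No delays, no wipe: ~{worst_case_min:.0f} minutes worst case")  # ~13

# With stock iOS, escalating lockouts stretch repeated failures into
# hours, and Erase Data (if enabled) destroys the key material on the
# 10th miss, so a brute-force attack never really gets started.
```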
Does the FBI know for sure if the Erase Data feature is turned on?
It doesn’t seem like it; the FBI just doesn’t want to take any chances. From the February 19 filing, emphasis ours:
The FBI has been unable to make attempts to determine the passcode to access the subject device because Apple has written, or “coded,” its operating system with a user-enabled “auto-erase function” that would, if enabled, result in the permanent destruction of the required encryption key material after 10 failed attempts at entering the correct passcode.
What was Apple’s response?
Apple posted an open letter to customers explaining its position. It reads in part:
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software—which does not exist today—would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
The main argument: Is this a backdoor to one iPhone or all of them?
Would the software the FBI is requesting be considered a "backdoor"?
That depends on whom you ask. For example, Bruce Schneier of Harvard’s Berkman Center for Internet and Society told our colleagues at NetworkWorld, “The FBI is asking Apple to reinstall a vulnerability they fixed.” He says the iPhone 5c didn’t initially have protection against brute-force attacks on the passcode; those protections were added in 2014 with iOS 8.
The government’s February 19 court filing flatly disagrees that it’s a backdoor, mostly because the order is written just for this phone.
Apple may maintain custody of the software, destroy it after its purpose under the Order has been served, refuse to disseminate it outside of Apple, and make it clear to the world that it does not apply to other devices or users without lawful court orders. As such, compliance with the Order presents no danger for any other phone, and is not "the equivalent of a master key, capable of opening hundreds of millions of locks."
But Apple believes that it is—that “master key” quote is right from Apple’s open letter.
Whether it's a backdoor or not, the FBI says it only wants to use it once. So what's wrong with a single-use backdoor?
It’s true that the DOJ says the FBI only wants to do this once. But the February 19 filing cites several other court cases as precedent to bolster its argument that Apple is being unreasonable to refuse this time. In both the San Bernardino investigation and a separate drug case in New York, the government argues that since Apple has helped before, it should be willing to help again.
So it’s a little strange that the FBI wants us to believe that once Apple builds this tool to help law enforcement brute-force a passcode, it would never be used again. Even if that particular software image file were never shared and promptly destroyed, the courts could use this case as precedent to order Apple to build it again.
But the government says that this software doesn’t ever have to leave the Apple campus—what’s wrong with that?
The government claims that Apple can retain total control over the software, and even the device itself. The February 19 filing reads: “the Order permits Apple to take possession of the subject device to load the programs in its own secure location, similar to what Apple has done for years for earlier operating systems, and permit the government to make its passcode attempts via remote access.”
But since Apple is being asked to create a tool for law enforcement to use, that tool would have to stand up to scrutiny if any evidence collected with it is ever used in court. Jonathan Zdziarski’s excellent blog post “Apple, FBI, and the Burden of Forensic Methodology” explains this really well. Zdziarski has extensive experience in iOS forensics, working with law enforcement and testifying as an expert in court.
He explains that tools used by law enforcement to collect evidence are legally known as “instruments,” and for evidence collected by such tools to be admissible in court, the court as well as the defense must have confidence that the tools are accurate and their results reproducible. New instruments (a breathalyzer, a speed-detecting radar gun, or a software tool like this one) have to be tested and validated by a third party like NIST (the National Institute of Standards and Technology) or NIJ (the National Institute of Justice), and generally accepted by the scientific community. That’s why breathalyzer tests are admissible but polygraphs are not.
Zdziarski also explains that before iOS 8, when Apple could still extract unencrypted data from a locked device, this was treated as a lab service, not an instrument. In that case, Apple would have to demonstrate to the court (usually through expert testimony or an affidavit) that it had the expertise to run the test, but it could claim “trade secrets” to avoid detailing the exact methods. When it’s law enforcement carrying out the method itself, though, the standard is different.
Now, just because evidence collected with this tool might not be admissible in court doesn’t make that evidence worthless. Law enforcement could learn something about Farook from his iPhone that it could then verify through other means that are admissible.
The iCloud problem
Back to Farook’s iPhone 5c: is there any other way to get the evidence the government wants? What else did they try?
The February 19 filing lists the other methods the government and Apple discussed, and why they won’t work, in a footnote on page 18, paraphrased here:
• Obtain cell phone toll records: The filing says “the government has of course done this,” but it’s insufficient since there’s a lot more on the phone than call and SMS records.
• Determine if any computers were paired to the phone: The government says there weren’t any.
• Attempt an auto-backup of the device with the related iCloud account: This didn’t work because neither the FBI nor the “owner” (the San Bernardino County Department of Public Health) knew the iCloud password.
• Obtain previous iCloud backups: The FBI did this too, but the most recent backup was from October 19, 2015; the filing says that’s not sufficient “and also back-ups do not appear to have the same amount of information as is on the phone itself.”
But that third method (attempting an auto-backup to iCloud) is where it gets really weird. The iCloud password was reset remotely, shortly after the crime, by the owner, i.e. the county. The February 19 filing says that this “had the effect of eliminating the possibility of an auto-backup.”
As explained by Ars Technica, the way they tried to force a backup was to take the iPhone to a known Wi-Fi network, plug it in, and leave it overnight, which should trigger a backup to iCloud if auto-backups are enabled. But it didn’t work, because the password had just been reset.
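Put another way, the overnight-on-Wi-Fi trick satisfied the usual triggers for an automatic backup, but the remote password reset meant the phone’s stored iCloud credentials no longer worked. Here is a hypothetical sketch of that failure mode (all names are invented for illustration):

```python
# Hypothetical model of the auto-backup preconditions described above;
# the field and function names are invented for illustration.
from dataclasses import dataclass

@dataclass
class Device:
    locked: bool
    on_power: bool
    on_known_wifi: bool
    icloud_credentials_valid: bool  # broken by the remote password reset

def will_auto_backup(d: Device) -> bool:
    # iOS attempts an automatic iCloud backup when the phone is locked,
    # charging, and on a trusted Wi-Fi network, but the backup can only
    # succeed if the phone can still authenticate to the iCloud account.
    return (d.locked and d.on_power and d.on_known_wifi
            and d.icloud_credentials_valid)

# The FBI's setup met the first three conditions, but the password reset
# had invalidated the credentials stored on the phone:
farook_5c = Device(True, True, True, icloud_credentials_valid=False)
print(will_auto_backup(farook_5c))  # False: no backup will ever fire
```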